Evolution of 3D Boson Stars with Waveform Extraction
Numerical results from a study of boson stars under nonspherical
perturbations using a fully general relativistic 3D code are presented together
with the analysis of emitted gravitational radiation. We have constructed a
simulation code suitable for the study of scalar fields in space-times of
general symmetry by bringing together components for addressing the initial
value problem, the full evolution system and the detection and analysis of
gravitational waves. Within a series of numerical simulations, we explicitly
extract the Zerilli and Newman-Penrose scalar gravitational waveforms
when the stars are subjected to different types of perturbations. Boson star
systems have rapidly decaying nonradial quasinormal modes and thus the complete
gravitational waveform could be extracted for all configurations studied. The
gravitational waves emitted from stable, critical, and unstable boson star
configurations are analyzed and the numerically observed quasinormal mode
frequencies are compared with known linear perturbation results. The
superposition of the high frequency nonspherical modes on the lower frequency
spherical modes was observed in the metric oscillations when perturbations with
radial and nonradial components were applied. The collapse of unstable boson
stars to black holes was simulated. The apparent horizons were observed to be
slightly nonspherical when initially detected and became spherical as the
system evolved. The application of nonradial perturbations proportional to
spherical harmonics is observed not to affect the collapse time. An unstable
star subjected to a large perturbation was observed to migrate to a stable
configuration.

Comment: 26 pages, 12 figures
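The quasinormal-mode frequencies discussed above are conventionally measured by fitting a damped sinusoid to the extracted ringdown waveform. A minimal sketch of that fitting step, using synthetic data (the amplitude, frequency, and damping time below are illustrative stand-ins, not values from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

# Single quasinormal-mode model: a damped sinusoid
# psi(t) = A * exp(-t / tau) * cos(2*pi*f*t + phi)
def ringdown(t, amp, tau, freq, phi):
    return amp * np.exp(-t / tau) * np.cos(2.0 * np.pi * freq * t + phi)

# Synthetic stand-in for an extracted Zerilli waveform (illustrative numbers).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 2000)
data = ringdown(t, 1.0, 30.0, 0.05, 0.3) + 0.01 * rng.standard_normal(t.size)

# Seed the frequency guess from the FFT peak, then refine by least squares.
spectrum = np.abs(np.fft.rfft(data))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
f0 = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

popt, _ = curve_fit(ringdown, t, data, p0=[1.0, 20.0, f0, 0.0])
amp, tau, freq, phi = popt
print(f"mode frequency ~ {freq:.4f}, damping time ~ {tau:.1f}")
```

The fitted frequency and damping time are what get compared against linear perturbation predictions; for rapidly damped modes like those of boson stars, the FFT seed matters because only a few oscillation cycles survive above the noise.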
The Dark Energy Survey Data Management System
The Dark Energy Survey collaboration will study cosmic acceleration with a
5000 deg2 grizY survey in the southern sky over 525 nights from 2011-2016. The
DES data management (DESDM) system will be used to process and archive these
data and the resulting science ready data products. The DESDM system consists
of an integrated archive, a processing framework, an ensemble of astronomy
codes and a data access framework. We are developing the DESDM system for
operation in the high performance computing (HPC) environments at NCSA and
Fermilab. Operating the DESDM system in an HPC environment offers both speed
and flexibility. We will employ it for our regular nightly processing needs,
and for more compute-intensive tasks such as large scale image coaddition
campaigns, extraction of weak lensing shear from the full survey dataset, and
massive seasonal reprocessing of the DES data. Data products will be available
to the Collaboration and later to the public through a virtual-observatory
compatible web portal. Our approach leverages investments in publicly available
HPC systems, greatly reducing hardware and maintenance costs to the project,
which must deploy and maintain only the storage, database platforms and
orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we
tested the current DESDM system on both simulated and real survey data. We used
TeraGrid to process 10 simulated DES nights (3TB of raw data), ingesting and
calibrating approximately 250 million objects into the DES Archive database. We
also used DESDM to process and calibrate over 50 nights of survey data acquired
with the Mosaic2 camera. Comparison to truth tables in the case of the
simulated data and internal crosschecks in the case of the real data indicate
that astrometric and photometric data quality is excellent.

Comment: To be published in the proceedings of the SPIE conference on
Astronomical Instrumentation (held in Marseille in June 2008). This preprint
is made available with the permission of SPIE. Further information together
with preprint containing full quality images is available at
http://desweb.cosmology.uiuc.edu/wik
The Dark Energy Survey Data Processing and Calibration System
The Dark Energy Survey (DES) is a 5000 deg2 grizY survey reaching
characteristic photometric depths of 24th magnitude (10 sigma) and enabling
accurate photometry and morphology of objects ten times fainter than in SDSS.
Preparations for DES have included building a dedicated 3 deg2 CCD camera
(DECam), upgrading the existing CTIO Blanco 4m telescope and developing a new
high performance computing (HPC) enabled data management system (DESDM).
The DESDM system will be used for processing, calibrating and serving the DES
data. The total data volumes are high (~2PB), and so considerable effort has
gone into designing an automated processing and quality control system. Special
purpose image detrending and photometric calibration codes have been developed
to meet the data quality requirements, while survey astrometric calibration,
coaddition and cataloging rely on new extensions of the AstrOmatic codes which
now include tools for PSF modeling, PSF homogenization, PSF corrected model
fitting cataloging and joint model fitting across multiple input images.
The DESDM system has been deployed on dedicated development clusters and HPC
systems in the US and Germany. An extensive program of testing with small rapid
turn-around and larger campaign simulated datasets has been carried out. The
system has also been tested on large real datasets, including Blanco Cosmology
Survey data from the Mosaic2 camera. In Fall 2012 the DESDM system will be used
for DECam commissioning, and, thereafter, the system will go into full science
operations.

Comment: 12 pages, submitted for publication in SPIE Proceedings 8451-1
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m
effective) primary mirror, a 9.6 deg2 field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5 sigma
point-source depth in a single visit in r will be ~24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg2 with
declination delta < +34.5 deg, and will be imaged multiple times in six bands,
ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe an
18,000 deg2 region about 800 times (summed over all six bands) during the
anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The
remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world.

Comment: 57 pages, 32 color figures, version with high-resolution figures
available from https://www.lsst.org/overvie
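The coadded survey depth follows, to first order, from the sqrt(N) growth of signal-to-noise when stacking N background-limited visits: the 5-sigma limit deepens by 2.5*log10(sqrt(N)) = 1.25*log10(N) magnitudes over a single visit. A rough check against the numbers above (the ~184 r-band visit count is an assumed per-band share of the ~800 total, for illustration only):

```python
import math

def coadd_depth(single_visit_depth, n_visits):
    """5-sigma point-source depth after stacking n_visits background-limited
    exposures: S/N grows as sqrt(n), so the magnitude limit deepens by
    2.5 * log10(sqrt(n)) = 1.25 * log10(n)."""
    return single_visit_depth + 1.25 * math.log10(n_visits)

# Illustrative inputs: r ~ 24.5 mag per visit, ~184 assumed r-band visits.
depth = coadd_depth(24.5, 184)
print(f"coadded r-band depth ~ {depth:.2f} mag")
```

This simple scaling gives roughly 27.3 mag, in the neighborhood of the quoted coadded depth; the exact figure depends on the per-band visit allocation and observing conditions, which this sketch ignores.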
The ALICE experiment at the CERN LHC
ALICE (A Large Ion Collider Experiment) is a general-purpose, heavy-ion detector at the CERN LHC which focuses on QCD, the strong-interaction sector of the Standard Model. It is designed to address the physics of strongly interacting matter and the quark-gluon plasma at extreme values of energy density and temperature in nucleus-nucleus collisions. Besides running with Pb ions, the physics programme includes collisions with lighter ions, lower energy running and dedicated proton-nucleus runs. ALICE will also take data with proton beams at the top LHC energy to collect reference data for the heavy-ion programme and to address several QCD topics for which ALICE is complementary to the other LHC detectors. The ALICE detector has been built by a collaboration including currently over 1000 physicists and engineers from 105 Institutes in 30 countries. Its overall dimensions are 16 × 16 × 26 m3 with a total weight of approximately 10 000 t. The experiment consists of 18 different detector systems each with its own specific technology choice and design constraints, driven both by the physics requirements and the experimental conditions expected at LHC. The most stringent design constraint is to cope with the extreme particle multiplicity anticipated in central Pb-Pb collisions. The different subsystems were optimized to provide high-momentum resolution as well as excellent Particle Identification (PID) over a broad range in momentum, up to the highest multiplicities predicted for LHC. This will allow for comprehensive studies of hadrons, electrons, muons, and photons produced in the collision of heavy nuclei. Most detector systems are scheduled to be installed and ready for data taking by mid-2008 when the LHC is scheduled to start operation, with the exception of parts of the Photon Spectrometer (PHOS), Transition Radiation Detector (TRD) and ElectroMagnetic Calorimeter (EMCal). These detectors will be completed for the high-luminosity ion run expected in 2010.
This paper describes in detail the detector components as installed for the first data taking in the summer of 2008.
ALICE: Physics performance report, volume I
ALICE is a general-purpose heavy-ion experiment designed to study the physics of strongly interacting matter and the quark-gluon plasma in nucleus-nucleus collisions at the LHC. It currently includes more than 900 physicists and senior engineers, from both nuclear and high-energy physics, from about 80 institutions in 28 countries. The experiment was approved in February 1997. The detailed design of the different detector systems has been laid down in a number of Technical Design Reports issued between mid-1998 and the end of 2001, and construction has started for most detectors. Since the last comprehensive information on detector and physics performance was published in the ALICE Technical Proposal in 1996, the detector as well as the simulation, reconstruction and analysis software have undergone significant development. The Physics Performance Report (PPR) will give an updated and comprehensive summary of the current status and performance of the various ALICE subsystems, including updates to the Technical Design Reports, where appropriate, as well as a description of systems which have not been published in a Technical Design Report. The PPR will be published in two volumes. The current Volume I contains: 1. a short theoretical overview and an extensive reference list concerning the physics topics of interest to ALICE, 2. relevant experimental conditions at the LHC, 3. a short summary and update of the subsystem designs, and 4. a description of the offline framework and Monte Carlo generators. Volume II, which will be published separately, will contain detailed simulations of combined detector performance, event reconstruction, and analysis of a representative sample of relevant physics observables from global event characteristics to hard processes. © 2004 IOP Publishing Ltd